Welcome everybody to our pattern recognition symposium!
My name is Andreas Maier and I am the host of the symposium. Every six months we have this workshop where all of our PhD students present their research and give an update on what they have been working on over the past half year. This is also accompanied by invited talks, and for this pattern recognition symposium I'm very proud that we have welcomed Gary Marcus from Robust AI and Pim de Haan from the University of Amsterdam.
In this first video of our pattern recognition symposium I want to feature the presentation by Pim de Haan. He is a second-year PhD student at the University of Amsterdam and a research associate at Qualcomm AI Research. Under the supervision of Max Welling he works on building machine learning methods that take into account the geometry and symmetries of the domain, using the mathematics of groups, representations and categories. Prior to his PhD, Pim was a visiting researcher at UC Berkeley's Robotics and AI Lab and obtained master's degrees in artificial intelligence in Amsterdam and in theoretical physics at the University of Cambridge. So today Pim will show us a bit of his latest work on natural graph networks and how to use basic category theory to make graph networks more expressive. Pim, it's a great pleasure to have you here and I'm very much looking forward to your presentation. So Pim, the stage is yours.
Thanks for the introduction, thanks for the invite. So I'll be talking about the work we presented at NeurIPS last year called natural graph networks. In this talk I'll try to introduce some basic concepts from category theory and show how we can use these to make more expressive graph neural networks. Along the way we'll be using some techniques that have been applied with quite some success to build equivariant neural networks on images, and we'll see how they can be applied to graphs. This has been a collaboration, with wonderful help from Taco Cohen and Max Welling, at the University of Amsterdam and Qualcomm AI Research.
During this talk I'll first look into a basic building block of a lot of the graph neural networks that are around these days, which are based on message passing, also called message passing networks. I'll give you an example of where they are lacking, what problems they cannot solve, and thus where there may be room for improvement. Then I'll set up a bit of notation about what we mean by graphs and what the symmetries of graphs are, and we'll look at prior work called equivariant graph networks that tries to use the symmetries of graphs to make graph networks more expressive. I'll show you how we can generalize these, by generalizing equivariance to something called a natural transformation, to build a global version of our natural graph networks. Lastly we'll talk about a local version of our natural graph network that is also scalable to larger graphs, which is a property we generally desire.
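To make the message-passing starting point concrete, here is a minimal sketch of one message-passing step on a graph given as an edge list. This is my own illustration, not the speaker's code: the function name, the NumPy implementation, and the weight matrices W_msg and W_self are assumptions made for the example.

```python
# Minimal sketch (illustration only, not the speaker's implementation) of one
# round of message passing on a graph given as an edge list, using NumPy.
import numpy as np

def message_passing_step(node_feats, edges, W_msg, W_self):
    """node_feats: (num_nodes, d) array of node features
    edges:         iterable of (src, dst) node-index pairs
    W_msg, W_self: (d, d) weight matrices shared across all edges and nodes
    """
    num_nodes, d = node_feats.shape
    aggregated = np.zeros((num_nodes, d))
    for src, dst in edges:
        # every edge uses the same message function
        aggregated[dst] += node_feats[src] @ W_msg
    # combine each node's own features with the summed incoming messages
    return np.tanh(node_feats @ W_self + aggregated)
```

Because the same weights act on every edge and messages are simply summed, relabelling the nodes just relabels the output in the same way; this permutation behaviour is the symmetry that equivariant and natural graph networks try to exploit more fully.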
So as you probably know, neural networks on graphs have experienced a massive boom in popularity in the last five years or so, and that's basically, I guess, because a lot of problems can be reformulated, or are already naturally formulated, in terms of information processing on graphs. So you can think of friendships in some social network being represented as a graph, think of molecules being represented by a molecular graph, or even 3D shapes being represented by the graph of vertices and edges of a 3D mesh. So for all these problems, if you have good neural networks that process information on
Accessible via: Open Access
Duration: 00:52:03 min
Recording date: 2021-03-02
Uploaded on: 2021-03-02 08:46:37
Language: en-US
Title: Natural Graph Networks
Bio: Pim de Haan is a second year PhD student at the University of Amsterdam and a research associate at Qualcomm AI research. Under supervision of Max Welling, he works on building machine learning methods that take into account the geometry and symmetries of the domain, using the mathematics of groups, representations and categories. Prior to his PhD, Pim was a visiting researcher at UC Berkeley's Robotics and AI Lab and obtained master's degrees in artificial intelligence in Amsterdam and in theoretical physics at the University of Cambridge.
Abstract: A key requirement for graph neural networks is that they must process a graph in a way that does not depend on how the graph is described. Traditionally this has been taken to mean that a graph network must be equivariant to node permutations. Here we show that instead of equivariance, the more general concept of naturality is sufficient for a graph network to be well-defined, opening up a larger class of graph networks. We define global and local natural graph networks, the latter of which are as scalable as conventional message passing graph neural networks while being more flexible. We give one practical instantiation of a natural network on graphs which uses an equivariant message network parameterization, yielding good performance on several benchmarks.
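As a hedged illustration of the equivariance requirement stated in the abstract, the following snippet numerically checks that relabelling the nodes of a graph permutes the output of the toy message-passing layer sketched earlier in the same way. All names here are from that illustrative sketch, not from the paper's code.

```python
# Numerical check of permutation equivariance for a toy message-passing layer
# (illustration only; not from the paper's code base).
import numpy as np

def message_passing_step(X, edges, W_msg, W_self):
    # same toy layer as in the earlier sketch
    agg = np.zeros_like(X)
    for s, t in edges:
        agg[t] += X[s] @ W_msg
    return np.tanh(X @ W_self + agg)

rng = np.random.default_rng(0)
n, d = 5, 3
X = rng.normal(size=(n, d))
edges = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 0)]
W_msg, W_self = rng.normal(size=(d, d)), rng.normal(size=(d, d))

perm = rng.permutation(n)                           # new label i <- old label perm[i]
inv = np.argsort(perm)                              # old label j -> new label inv[j]
X_perm = X[perm]                                    # features under the relabelling
edges_perm = [(inv[s], inv[t]) for s, t in edges]   # same graph, new node names

out = message_passing_step(X, edges, W_msg, W_self)
out_perm = message_passing_step(X_perm, edges_perm, W_msg, W_self)

# Equivariance: processing the relabelled graph yields the relabelled output.
assert np.allclose(out[perm], out_perm)
```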
Paper: https://arxiv.org/abs/2007.08349
This video is released under CC BY 4.0. Please feel free to share and reuse. For reminders to watch the new video follow on Twitter https://twitter.com/maier_ak or LinkedIn https://www.linkedin.com/in/andreas-maier-a6870b1a6/. Also, join our network for information about talks, videos, and job offers in our Facebook and LinkedIn Groups https://lme.tf.fau.de/lab/join-the-pattern-recognition-lab/.
Music Reference:
Damiano Baldoni - Thinking of You (Intro) https://freemusicarchive.org/music/Damiano_Baldoni/Old_Beat/Thinking_of_you_1513
Damiano Baldoni - A Ghra (Outro) https://freemusicarchive.org/music/Damiano_Baldoni/Lost_Dinasty/A_GhrO